Remixing physical objects through tangible tools
Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2011. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Cataloged from student-submitted PDF version of thesis. Includes bibliographical references (p. 147-164).
In this document we present new tools for remixing physical objects. These tools allow users to copy, edit and manipulate the properties of one or more objects to create a new physical object. We already have these capabilities in digital media: we can easily mash up videos, music and text. However, it remains difficult to remix physical objects, and we cannot access the advantages of digital media, which are nondestructive, scalable and scriptable. We can bridge this gap by integrating 2D and 3D scanning technology into design tools and by employing affordable rapid prototyping technology to materialize the remixed objects. In so doing, we hope to promote copying as a tool for creation. This document presents two tools, CopyCAD and KidCAD, the first designed for makers and crafters, the second for children. CopyCAD is an augmented Computer Numerically Controlled (CNC) milling machine that allows users to copy arbitrary real-world object geometry into 2D CAD designs at scale through the use of a camera-projector system. CopyCAD gathers properties from physical objects, sketches and touch interactions directly on a milling machine, allowing novice users to copy parts of real-world objects, modify them and create a new physical part. KidCAD is a sculpting interface built on top of a gel-based real-time 2.5D scanner. It allows children to stamp objects into the block of gel, which are scanned in real time, as if they were stamped into clay. Children can use everyday objects, their hands and tangible tools to design new toys or objects that will be 3D printed.
This work enables novice users to easily approach designing physical objects by copying from other objects and sketching new designs. With increased access to such tools, we hope that a wide range of people will be empowered to create their own objects, toys, tools and parts.
by Sean Follmer. S.M.
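A camera-projector CNC system like the one described above must map camera pixels into the milling machine's coordinate frame so that copied geometry lands "at scale". As a minimal sketch (the function names and the least-squares affine calibration below are assumptions for illustration, not CopyCAD's documented implementation), a few known pixel/machine correspondences suffice to fit such a map:

```python
import numpy as np

def fit_camera_to_machine(cam_pts, machine_pts):
    """Least-squares affine map from camera pixels to machine coordinates.

    Hypothetical calibration sketch: cam_pts and machine_pts are (n, 2)
    arrays of corresponding points, n >= 3. Returns a (3, 2) matrix M
    such that [x, y, 1] @ M = [X, Y] in machine coordinates.
    """
    cam = np.asarray(cam_pts, dtype=float)
    mach = np.asarray(machine_pts, dtype=float)
    A = np.hstack([cam, np.ones((len(cam), 1))])  # homogeneous pixel coords
    M, *_ = np.linalg.lstsq(A, mach, rcond=None)
    return M

def apply_map(M, pt):
    """Map a single camera pixel (x, y) into machine coordinates."""
    return np.array([pt[0], pt[1], 1.0]) @ M
```

With more than three correspondences the least-squares fit also averages out small detection errors, which matters when the "calibration targets" are taped-down reference marks on a mill bed.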
LineFORM: Actuated Curve Interfaces for Display, Interaction, and Constraint
In this paper we explore the design space of actuated curve interfaces, a novel class of shape-changing interfaces. Physical curves have several interesting characteristics from the perspective of interaction design: they have a variety of inherent affordances; they can easily represent abstract data; and they can act as constraints, boundaries, or borderlines. By utilizing such aspects of lines and curves, together with the added capability of shape change, new possibilities for display, interaction and body constraint emerge. In order to investigate these possibilities we have implemented two actuated curve interfaces at different scales. LineFORM, our implementation, inspired by serpentine robotics, comprises a serial chain of 1-DOF servo motors with integrated sensors for direct manipulation. To motivate this work we present various applications such as shape-changing cords, mobiles, body constraints, and data manipulation tools.
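The serial chain of 1-DOF joints described above admits a very simple kinematic model. As a sketch (a planar chain with uniform link lengths is an assumption; the actual LineFORM hardware is a 3D serpentine mechanism), the curve's shape can be computed by accumulating relative joint angles along the chain:

```python
import numpy as np

def forward_kinematics(joint_angles, link_length=1.0):
    """Planar forward kinematics for a serial chain of 1-DOF joints.

    Hypothetical model of an actuated curve: each joint contributes a
    relative rotation and each link has the same length. Returns an
    (n+1, 2) array of joint positions, starting at the origin.
    """
    pts = [np.zeros(2)]
    heading = 0.0
    for angle in joint_angles:
        heading += angle  # relative angles accumulate along the chain
        step = link_length * np.array([np.cos(heading), np.sin(heading)])
        pts.append(pts[-1] + step)
    return np.array(pts)
```

Driving the joint angles over time is then enough to render cords, waves, or constraint shapes with the same chain.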
Electroadhesive Auxetics as Programmable Layer Jamming Skins for Formable Crust Shape Displays
Shape displays are a class of haptic devices that enable whole-hand haptic exploration of 3D surfaces. However, their scalability is limited by the mechanical complexity and high cost of traditional actuator arrays. In this paper, we propose using electroadhesive auxetic skins as a strain-limiting layer to create programmable shape change in a continuous ("formable crust") shape display. Auxetic skins are manufactured as flexible printed circuit boards with dielectric-laminated electrodes on each auxetic unit cell (AUC), using monolithic fabrication to lower cost and assembly time. By layering multiple sheets and applying a voltage between electrodes on subsequent layers, electroadhesion locks individual AUCs, achieving a maximum in-plane stiffness variation of 7.6x with a power consumption of 50 µW/AUC. We first characterize an individual AUC and compare results to a kinematic model. We then validate the ability of a 5x5 AUC array to actively modify its own axial and transverse stiffness. Finally, we demonstrate this array in a continuous shape display as a strain-limiting skin to programmatically modulate the shape output of an inflatable LDPE pouch. Integrating electroadhesion with auxetics enables new capabilities for scalable, low-profile, and low-power control of flexible robotic systems.
Comment: Accepted to IEEE International Conference on Robotics and Automation (ICRA 2023).
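The electroadhesive locking described above is, at its core, an electrostatic clamping pressure between dielectric-laminated electrodes. A textbook parallel-plate estimate gives a feel for the magnitudes involved; note that the voltage, dielectric thickness, and permittivity used below are illustrative assumptions, not parameters from the paper:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def electroadhesive_pressure(voltage, dielectric_thickness, rel_permittivity):
    """Ideal parallel-plate estimate of electroadhesive clamping pressure.

    P = eps0 * eps_r * V^2 / (2 * d^2), in pascals. Real electroadhesive
    skins deviate from this (air gaps, fringing fields, surface roughness),
    so treat it as an order-of-magnitude sketch only.
    """
    return EPS0 * rel_permittivity * voltage**2 / (2.0 * dielectric_thickness**2)
```

The quadratic dependence on voltage and inverse-square dependence on dielectric thickness explain why such skins can reach useful clamping forces at microwatt-level power: the load is almost purely capacitive.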
deForm: An interactive malleable surface for capturing 2.5D arbitrary objects, tools and touch
We introduce a novel input device, deForm, that supports 2.5D touch gestures, tangible tools, and arbitrary objects through real-time structured light scanning of a malleable surface of interaction. DeForm captures high-resolution surface deformations and 2D grey-scale textures of a gel surface through a three-phase structured light 3D scanner. This technique can be combined with IR projection to allow for invisible capture, providing the opportunity for co-located visual feedback on the deformable surface. We describe methods for tracking fingers, whole-hand gestures, and arbitrary tangible tools. We outline a method for physically encoding fiducial marker information in the height map of tangible tools. In addition, we describe a novel method for distinguishing between human touch and tangible tools, through capacitive sensing on top of the input surface. Finally, we motivate our device through a number of sample applications.
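Three-phase structured light recovers a wrapped phase map, and hence surface height, from three fringe images shifted by 120 degrees. A minimal sketch of the standard three-step phase-shifting formula follows; deForm's actual pipeline (including its IR capture and phase unwrapping) may differ in detail:

```python
import numpy as np

def wrapped_phase(i1, i2, i3):
    """Wrapped phase from three fringe images shifted by -120/0/+120 degrees.

    Standard three-step phase-shifting: for I_k = A + B*cos(phi + theta_k),
    phi = atan2(sqrt(3)*(I1 - I3), 2*I2 - I1 - I3). Works elementwise on
    image arrays; the result is wrapped to (-pi, pi] and must be unwrapped
    and calibrated before it can be read as height.
    """
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)
```

Because the ambient term A and fringe contrast B cancel in the formula, the recovered phase is insensitive to uniform lighting changes, which is one reason phase-shifting is attractive for scanning a gel surface in real time.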
inFORM: Dynamic Physical Affordances and Constraints through Shape and Object Actuation
Past research on shape displays has primarily focused on rendering content and user interface elements through shape output, with less emphasis on dynamically changing UIs. We propose utilizing shape displays in three different ways to mediate interaction: to facilitate by providing dynamic physical affordances through shape change, to restrict by guiding users with dynamic physical constraints, and to manipulate by actuating physical objects. We outline potential interaction techniques and introduce Dynamic Physical Affordances and Constraints with our inFORM system, built on top of a state-of-the-art shape display, which provides for variable stiffness rendering and real-time user input through direct touch and tangible interaction. A set of motivating examples demonstrates how dynamic affordances, constraints and object actuation can create novel interaction possibilities.
National Science Foundation (U.S.). Graduate Research Fellowship (Grant 1122374). Swedish Research Council (Fellowship). Blanceflor Foundation (Scholarship).
Physical Telepresence: Shape Capture and Display for Embodied, Computer-mediated Remote Collaboration
We propose a new approach to Physical Telepresence, based on shared workspaces with the ability to capture and remotely render the shapes of people and objects. In this paper, we describe the concept of shape transmission, and propose interaction techniques to manipulate remote physical objects and physical renderings of shared digital content. We investigate how the representation of users' body parts can be altered to amplify their capabilities for teleoperation. We also describe the details of building and testing prototype Physical Telepresence workspaces based on shape displays. A preliminary evaluation shows how users are able to manipulate remote objects, and we report on our observations of several different manipulation techniques that highlight the expressive nature of our system.
National Science Foundation (U.S.). Graduate Research Fellowship Program (Grant No. 1122374).
Kinetic Blocks: Actuated Constructive Assembly for Interaction and Display
Pin-based shape displays not only give physical form to digital information, they have the inherent ability to accurately move and manipulate objects placed on top of them. In this paper we focus on such object manipulation: we present ideas and techniques that use the underlying shape change to give kinetic ability to otherwise inanimate objects. First, we describe the shape display's ability to assemble, disassemble, and reassemble structures from simple passive building blocks through stacking, scaffolding, and catapulting. A technical evaluation demonstrates the reliability of the presented techniques. Second, we introduce special kinematic blocks that are actuated and sensed through the underlying pins. These blocks translate vertical pin movements into other degrees of freedom like rotation or horizontal movement. This interplay of the shape display with objects on its surface allows us to render otherwise inaccessible forms, like overhangs, and enables richer input and output.
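One intuition for how vertical pin motion becomes rotation: a rigid block resting on two pins at different heights tilts by the arctangent of the height difference over the pin spacing. This is only the simplest geometric case (the kinematic blocks in the paper use dedicated mechanisms), but it captures the basic conversion:

```python
import math

def block_tilt(h1, h2, pin_spacing):
    """Tilt angle (radians) of a rigid block resting on two pins.

    Simplified 2D illustration of converting differential vertical pin
    heights (h1, h2, in meters) into a rotation; pin_spacing is the
    horizontal distance between the two contact points.
    """
    return math.atan2(h2 - h1, pin_spacing)
```

Driving the two pins with opposite-sign height offsets over time rocks the block back and forth, which is the rotational degree of freedom the passive block could not reach on its own.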
Sublimate: State-Changing Virtual and Physical Rendering to Augment Interaction with Shape Displays
Recent research in 3D user interfaces pushes towards immersive graphics and actuated shape displays. Our work explores the hybrid of these directions, and we introduce sublimation and deposition as metaphors for the transitions between physical and virtual states. We discuss how digital models, handles and controls can be interacted with as virtual 3D graphics or dynamic physical shapes, and how user interfaces can rapidly and fluidly switch between those representations. To explore this space, we developed two systems that integrate actuated shape displays and augmented reality (AR) for co-located physical shapes and 3D graphics. Our spatial optical see-through display provides a single user with head-tracked stereoscopic augmentation, whereas our handheld devices enable multi-user interaction through video see-through AR. We describe interaction techniques and applications that explore 3D interaction for these new modalities. We conclude by discussing the results from a user study that show how freehand interaction with physical shape displays and co-located graphics can outperform wand-based interaction with virtual 3D graphics.
National Science Foundation (U.S.). Graduate Research Fellowship (Grant 1122374).